A Superlinearly-Convergent Proximal Newton-type Method for the Optimization of Finite Sums
Abstract
Contents
3 Local convergence rate: simple case
3.1 Theorem statement
3.2 Main estimate
3.3 Convergence rate of the sequence
3.4 Proof of the theorem
Similar references
A Superlinearly-Convergent Proximal Newton-type Method for the Optimization of Finite Sums
We consider the problem of optimizing the strongly convex sum of a finite number of convex functions. Standard algorithms for this problem in the class of incremental/stochastic methods have at most a linear convergence rate. We propose a new incremental method whose convergence rate is superlinear: the Newton-type incremental method (NIM). The idea of the method is to introduce a model...
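For reference, the finite-sum setting described in this abstract can be written as follows; the components f_i, the regularizer h, and the matrix H_k are generic notation introduced here for illustration, not symbols taken from the paper. A minimal sketch of the problem and of a generic proximal Newton-type step:

% Notation (f_i, h, H_k) is illustrative; the paper's own model may differ.
\[
  \min_{x \in \mathbb{R}^d} \; F(x) := \frac{1}{n}\sum_{i=1}^{n} f_i(x) + h(x),
  \qquad f_i \ \text{convex and smooth}, \ h \ \text{convex},
\]
\[
  x_{k+1} \in \arg\min_{x} \; \nabla f(x_k)^{\top}(x - x_k)
  + \tfrac{1}{2}\,(x - x_k)^{\top} H_k (x - x_k) + h(x),
  \qquad f := \tfrac{1}{n}\textstyle\sum_{i} f_i,
\]

where H_k approximates the Hessian of the smooth part; an incremental method refreshes the gradient and curvature information of only one (or a few) components f_i per iteration rather than all n.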
Superlinearly convergent exact penalty projected structured Hessian updating schemes for constrained nonlinear least squares: asymptotic analysis
We present a structured algorithm for solving constrained nonlinear least squares problems and establish its local two-step Q-superlinear convergence. The approach is based on an adaptive structured scheme, due to Mahdavi-Amiri and Bartels, applied to the exact penalty method of Coleman and Conn for nonlinearly constrained optimization problems. The structured adaptation also makes use of the ideas of N...
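For context, a constrained nonlinear least squares problem of the kind treated above is commonly written as below; the residual r, constraint map c, and Jacobian J are standard notation introduced here, not symbols taken from the snippet.

% Standard constrained NLS formulation; r, c, J are illustrative notation.
\[
  \min_{x \in \mathbb{R}^n} \; \tfrac{1}{2}\,\|r(x)\|_2^{2}
  \quad \text{subject to} \quad c(x) = 0,
  \qquad
  \nabla^{2}\!\Big(\tfrac{1}{2}\|r(x)\|_2^{2}\Big)
  = J(x)^{\top} J(x) + \sum_{i} r_i(x)\, \nabla^{2} r_i(x).
\]

"Structured" methods exploit this split of the Hessian, keeping the exact Gauss-Newton term J^T J and approximating only the second-order residual term.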
Proximal Quasi-Newton Methods for Nondifferentiable Convex Optimization
... a superlinearly convergent algorithm for minimizing the Moreau-Yosida regularization F. However, this algorithm makes use of the generalized Jacobian of F, instead of matrices B_k generated by a quasi-Newton formula. Moreover, the line search is performed on the function F, rath...
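For reference, the Moreau-Yosida regularization mentioned in this snippet is defined, for a convex function f and a parameter λ > 0 (standard notation, introduced here for illustration), by

% Standard definition; f and lambda are generic, not taken from the snippet.
\[
  F(x) \;=\; \min_{y}\Big\{ f(y) + \tfrac{1}{2\lambda}\,\|y - x\|^{2} \Big\}.
\]

F is convex and continuously differentiable even when f is not, with ∇F(x) = (x − p(x))/λ, where p(x) is the minimizer above (the proximal point of x); this is why minimizing F is a common route to minimizing a nondifferentiable convex f.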
A QP-free constrained Newton-type method for variational inequality problems
We consider a simply constrained optimization reformulation of the Karush-Kuhn-Tucker conditions arising from variational inequalities. Based on this reformulation, we present a new Newton-type method for the solution of variational inequalities. The main properties of this method are: (a) it is well-defined for an arbitrary variational inequality problem, (b) it is globally convergent at least ...
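As a reminder of the underlying problem class (standard notation, not taken from the snippet), the variational inequality VI(X, F) asks for a point x* ∈ X such that

% Standard VI definition; X and F here are generic notation.
\[
  F(x^{*})^{\top}(x - x^{*}) \;\ge\; 0 \qquad \text{for all } x \in X.
\]

When X is given by smooth equality and inequality constraints, the KKT conditions of this problem form a mixed system in x and the multipliers, and recasting that system as a simply constrained (e.g. nonnegativity-constrained) optimization problem is the kind of reformulation the abstract refers to.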
Quadratically and Superlinearly Convergent Algorithms for the Solution of Inequality Constrained Minimization Problems
In this paper some Newton and quasi-Newton algorithms for the solution of inequality constrained minimization problems are considered. All the algorithms described produce sequences {x_k} converging q-superlinearly to the solution. Furthermore, under mild assumptions, a q-quadratic convergence rate in x is also attained. Other features of these algorithms are that the solution of linear syste...